
    A standards-based security model for health information systems

    In the healthcare environment, various types of patient information are stored in electronic format. This avoids re-entering information that was captured previously. In the past, this information was stored on paper and kept in large filing cabinets. However, with the technology advancements that have occurred over the years, the idea of storing patient information in electronic systems arose. This led to a number of electronic health information systems being created, which in turn led to an increase in possible security risks. Any organization that stores information of a sensitive nature must apply information security principles in order to ensure that the stored information is kept secure. At a basic level, this entails ensuring the confidentiality, integrity and availability of the information, which is not an easy feat in today’s distributed and networked environments. This paved the way for organized standardization activities in the areas of information security and information security management. Throughout history, practices have been created to help “standardize” industries of all kinds, to the extent that there are professional organizations whose main objective is to create such standards to help connect industries all over the world. This applies equally to the healthcare environment, where standardization took off in the late 1980s. Healthcare organizations must follow standardized security measures to ensure that patient information stored in health information systems is kept secure. However, the proliferation of standards makes it difficult to understand, adopt and deploy these standards in a coherent manner. This research therefore proposes a standards-based security model for health information systems, to ensure that such standards are applied in a manner that contributes to securing the healthcare environment as a whole, rather than in a piecemeal fashion.

    Pathologies of acute interstitial pneumonia in feedlot cattle

    Citation: Valles, J. A., Apley, M. D., Reinhardt, C. D., Bartle, S. J., & Thomson, D. U. (2016). Pathologies of acute interstitial pneumonia in feedlot cattle. American Journal of Animal and Veterinary Sciences, 11(1), 1-7. doi:10.3844/ajavsp.2016.1.7
    Acute Interstitial Pneumonia (AIP) is a costly issue that affects feedlot cattle. Research has yet to elucidate the etiology of AIP; therefore, a case-control study was conducted to evaluate possible management and physiological factors that contribute to AIP in feedlot cattle. The experiment was conducted during the summer of 2011 in a commercial feedyard in Kansas. Animals exhibiting clinical signs of AIP and a control animal from the same pen were selected for ante-mortem examination. Post-mortem AIP cases were also selected for additional examination. Ante-mortem measurements included rumen gas cap hydrogen sulfide and pH, rectal temperature and body weight. Post-mortem examination added histological examination of lung tissue. Rectal temperature was greater in the AIP cattle (40.6±0.16°C) than controls (39.7±0.16°C; p<0.10). Post-mortem rumen pH values were 6.3±0.4 and 5.7±0.6 for AIP and control cattle, respectively. Histological evaluation of lung samples showed that bronchiolitis was present in about 90% of the cattle affected with AIP. About 75% of the cattle with AIP also had bronchopneumonia. No differences in feed intake patterns or serum amylase or lipase levels were noted between treatments (p>0.20). This study generally confirms that AIP tends to occur more in heifers relative to steers, occurs in cattle at heavier weights or later in the feeding period, and tends to be associated pathologically with bronchiolitis and bronchopneumonia. The lack of differences in rumen measures and feed intake data between AIP and control cattle suggests that feed intake patterns and rumen fermentation may not impact AIP in feedlot cattle and that it may be more directly related to bronchiolitis/bronchopneumonia due to chronic irritation or infection. © 2016 Jose A. Valles, Michael D. Apley, Chris D. Reinhardt, Steven J. Bartle and Daniel U. Thomson.
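    The case-control comparisons above report group means with standard errors and p-values, but the abstract does not state which statistical test was used. As a rough illustration only, the sketch below assumes a Welch two-sample t-test on hypothetical rectal-temperature data; the numbers are simulated, not the study's data.

```python
# Minimal sketch of a case-control comparison of group means, as reported above.
# All data are simulated; the authors' actual test and data are not given in the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
aip_temp = rng.normal(40.6, 0.7, size=30)   # hypothetical AIP rectal temperatures (deg C)
ctrl_temp = rng.normal(39.7, 0.7, size=30)  # hypothetical control rectal temperatures (deg C)

t_stat, p_value = stats.ttest_ind(aip_temp, ctrl_temp, equal_var=False)  # Welch t-test
print(f"AIP mean:     {aip_temp.mean():.1f} +/- {stats.sem(aip_temp):.2f}")
print(f"Control mean: {ctrl_temp.mean():.1f} +/- {stats.sem(ctrl_temp):.2f}")
print(f"Welch t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```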

    Flight Test of an Intelligent Flight-Control System

    The F-15 Advanced Controls Technology for Integrated Vehicles (ACTIVE) airplane (see figure) was the test bed for a flight test of an intelligent flight control system (IFCS). This IFCS utilizes a neural network to determine critical stability and control derivatives for a control law, the real-time gains of which are computed by an algorithm that solves the Riccati equation. These derivatives are also used to identify the parameters of a dynamic model of the airplane. The model is used in a model-following portion of the control law, in order to provide specific vehicle handling characteristics. The flight test of the IFCS marks the initiation of the Intelligent Flight Control System Advanced Concept Program (IFCS ACP), which is a collaboration between NASA and Boeing Phantom Works. The goals of the IFCS ACP are to (1) develop the concept of a flight-control system that uses neural-network technology to identify aircraft characteristics to provide optimal aircraft performance, (2) develop a self-training neural network to update estimates of aircraft properties in flight, and (3) demonstrate the aforementioned concepts on the F-15 ACTIVE airplane in flight. The activities of the initial IFCS ACP were divided into three phases, each devoted to the attainment of a different objective. The objective of Phase I was to develop a pre-trained neural network to store and recall the wind-tunnel-based stability and control derivatives of the vehicle. The objective of Phase II was to develop a neural network that can learn how to adjust the stability and control derivatives to account for failures or modeling deficiencies. The objective of Phase III was to develop a flight control system that uses the neural network outputs as a basis for controlling the aircraft. The flight test of the IFCS was performed in stages. In the first stage, the Phase I version of the pre-trained neural network was flown in a passive mode: the neural network software ran on flight data inputs with its outputs provided to instrumentation only, and the IFCS was not used to control the airplane. In another stage of the flight test, the Phase I pre-trained neural network was integrated into a Phase III version of the flight control system. The Phase I pre-trained neural network provided real-time stability and control derivatives to a Phase III controller that was based on a stochastic optimal feedforward and feedback technique (SOFFT). This combined Phase I/III system was operated together with the research flight-control system (RFCS) of the F-15 ACTIVE during the flight test. The RFCS enables the pilot to switch quickly from the experimental-research flight mode back to the safe conventional mode. These initial IFCS ACP flight tests were completed in April 1999. The Phase I/III flight test milestone was to demonstrate, across a range of subsonic and supersonic flight conditions, that the pre-trained neural network could be used to supply real-time aerodynamic stability and control derivatives to the closed-loop optimal SOFFT flight controller. Additional objectives attained in the flight test included (1) flight qualification of a neural-network-based control system; (2) the use of a combined neural-network/closed-loop optimal flight-control system to obtain level-one handling qualities; and (3) demonstration, through variation of control gains, that different handling qualities can be achieved by setting new target parameters. In addition, data for the Phase II (on-line-learning) neural network were collected, during the use of stacked-frequency-sweep excitation, for post-flight analysis. Initial analysis of these data showed the potential for future flight tests that will incorporate the real-time identification and on-line learning aspects of the IFCS.
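    The control-law structure described above pairs a neural network that supplies stability and control derivatives with a gain computation based on the Riccati equation. The sketch below illustrates that pattern only in broad strokes, using a plain LQR solve on a short-period aircraft model; the matrices, cost weights, and the stand-in "neural network" lookup are all illustrative assumptions, and the actual Phase III controller used the SOFFT technique rather than this simple LQR.

```python
# Minimal sketch of the gain-computation step described above: derivatives
# (here faked by a stand-in for the pre-trained neural network) populate a
# linear short-period model, and feedback gains are obtained by solving the
# continuous-time algebraic Riccati equation. All values are illustrative
# assumptions, not flight or wind-tunnel data.
import numpy as np
from scipy.linalg import solve_continuous_are

def neural_net_derivatives(mach: float, altitude_ft: float):
    """Stand-in for the pre-trained network: returns short-period
    stability/control derivatives for a given flight condition."""
    # Hypothetical constants; a real network would interpolate wind-tunnel data.
    z_alpha, m_alpha, m_q, z_de, m_de = -1.2, -4.0, -0.9, -0.1, -6.5
    return z_alpha, m_alpha, m_q, z_de, m_de

def lqr_gains(mach: float, altitude_ft: float):
    z_alpha, m_alpha, m_q, z_de, m_de = neural_net_derivatives(mach, altitude_ft)
    # Short-period approximation: state x = [angle of attack, pitch rate],
    # control u = elevator deflection.
    a = np.array([[z_alpha, 1.0],
                  [m_alpha, m_q]])
    b = np.array([[z_de],
                  [m_de]])
    q = np.diag([10.0, 1.0])   # state weighting (assumed)
    r = np.array([[1.0]])      # control weighting (assumed)
    # Solve the Riccati equation and form the LQR gain K = R^-1 B^T P.
    p = solve_continuous_are(a, b, q, r)
    return np.linalg.solve(r, b.T @ p)

print(lqr_gains(mach=0.9, altitude_ft=20000))
```

    In the system described in the abstract, these derivatives would be refreshed in real time as flight conditions change, so the gains are recomputed continuously rather than fixed at design time.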

    Comment on “Discovery of davemaoite, CaSiO₃-perovskite, as a mineral from the lower mantle”

    Tschauner et al. (Reports, 11 November 2021, p. 891) present evidence that diamond GRR-1507 formed in the lower mantle. Instead, the data support a much shallower origin in cold, subcratonic lithospheric mantle. X-ray diffraction data are well matched to phases common in microinclusion-bearing lithospheric diamonds. The calculated bulk inclusion composition is too imprecise to uniquely confirm CaSiO₃ stoichiometry and is equally consistent with inclusions observed in other lithospheric diamonds.

    A cGAS-dependent response links DNA damage and senescence in alveolar epithelial cells: A potential drug target in IPF

    Alveolar epithelial cell (AEC) senescence is implicated in the pathogenesis of idiopathic pulmonary fibrosis (IPF). Mitochondrial dysfunction, including release of mitochondrial DNA (mtDNA), is a feature of senescence, which led us to investigate the role of the DNA-sensing enzyme cyclic GMP-AMP synthase (cGAS) in IPF, with a focus on AEC senescence. cGAS expression in fibrotic tissue from lungs of IPF patients was detected within cells immunoreactive for epithelial cell adhesion molecule (EpCAM) and p21, epithelial and senescence markers respectively. Submerged primary cultures of AECs isolated from lung tissue of IPF patients (IPF-AECs, n=5) exhibited higher baseline senescence than AECs from control donors (Ctrl-AECs, n=5-7), as assessed by increased nuclear histone H2AX phosphorylation (γH2AX), p21 mRNA and expression of senescence-associated secretory phenotype (SASP) cytokines. Pharmacological cGAS inhibition using RU.521 diminished IPF-AEC senescence in culture and attenuated induction of Ctrl-AEC senescence following etoposide-induced DNA damage. Short interfering RNA (siRNA) knockdown of cGAS also attenuated etoposide-induced senescence of the AEC line A549. Higher levels of mtDNA were detected in the cytosol and culture supernatants of primary IPF- and etoposide-treated Ctrl-AECs than in Ctrl-AECs at baseline. Furthermore, ectopic mtDNA augmented cGAS-dependent senescence of Ctrl-AECs, whereas DNase I treatment diminished IPF-AEC senescence. This study provides evidence that a self-DNA driven, cGAS-dependent response augments AEC senescence, identifying cGAS as a potential therapeutic target for IPF.

    Supporting employers and their employees with mental hEalth conditions to remain eNgaged and producTive at wORk (MENTOR): A feasibility randomised controlled trial

    Employees with mental health conditions often struggle to remain in employment. During the COVID-19 pandemic, these employees faced additional stressors, including worsening mental health and work productivity. In 2020, as part of a larger programme of work called the Mental Health and Productivity Pilot (MHPP), we developed a new early intervention (MENTOR) that jointly involved employees, managers, and a new professional, the Mental Health Employment Liaison Worker (MHELW). The intervention involved trained MHELWs delivering ten sessions to employees with existing mental health conditions and their managers (three individual sessions and four joint sessions) over twelve weeks. These sessions aimed to improve psychological flexibility, interpersonal relationships, and engagement of employees. This feasibility randomised controlled trial aimed to examine the feasibility and acceptability of the intervention from the perspective of employees and managers using a mixed methods approach. The intervention was largely considered feasible and acceptable. Initial findings suggest there may be benefits for employees' productivity and mental health, and for managers' mental health knowledge. Logistical challenges acted as a barrier to the participation and retention of participants in the trial. The major strengths of this study were the co-design and interdisciplinary approach taken. Overall, findings suggest that this novel intervention has potential but needs some adjustments and testing in a larger sample.

    Genetic insights into the introduction history of black rats into the eastern Indian Ocean

    Islands can be powerful demonstrations of how destructive invasive species can be to endemic faunas and insular ecologies. Oceanic islands in the eastern Indian Ocean have suffered dramatically from the impact of one of the world’s most destructive invasive species, the black rat, causing the loss of endemic terrestrial mammals and ongoing threats to ground-nesting birds. We use molecular genetic methods on both ancient and modern samples to establish the origins and minimum invasion frequencies of black rats on Christmas Island and the Cocos-Keeling Islands. We find that each island group had multiple incursions of black rats from diverse geographic and phylogenetic sources. Furthermore, contemporary black rat populations on these islands are highly admixed, to the point of potentially obscuring their geographic sources. These hybridisation events between black rat taxa also pose potential dangers to human populations on the islands through novel disease risks. The threat of ongoing introductions from yet additional geographic sources is highlighted by genetic identification of black rats found on ships, which provides insight into how recent ship-borne human smuggling activity to Christmas Island can negatively impact its endemic species.

    Acceptability of HIV self-testing to support pre-exposure prophylaxis among female sex workers in Uganda and Zambia: results from two randomized controlled trials

    Background: HIV pre-exposure prophylaxis (PrEP) is highly effective for prevention of HIV acquisition, but requires HIV testing at regular intervals. Female sex workers (FSWs) are a priority population for HIV prevention interventions in many settings, but face barriers to accessing healthcare. Here, we assessed the acceptability of HIV self-testing for regular HIV testing during PrEP implementation among FSWs participating in a randomized controlled trial of HIV self-testing delivery models. Methods: We used data from two HIV self-testing randomized controlled trials with identical protocols in Zambia and Uganda. From September–October 2016, participants were randomized in groups to: (1) direct delivery of an HIV self-test, (2) delivery of a coupon, exchangeable for an HIV self-test at nearby health clinics, or (3) standard HIV testing services. Participants completed assessments at baseline and 4 weeks. Participants reporting that their last HIV test was negative were asked about their interest in various PrEP modalities and their HIV testing preferences. We used mixed effects logistic regression models to measure differences in outcomes across randomization arms at 4 weeks. Results: At 4 weeks, 633 participants in Zambia and 749 participants in Uganda reported testing negative at their last HIV test. The majority of participants in both studies were “very interested” in daily oral PrEP (91% Zambia; 66% Uganda) and preferred HIV self-testing to standard testing services while on PrEP (87% Zambia; 82% Uganda). Participants in the HIV self-testing intervention arms more often reported a preference for HIV self-testing over standard testing services to support PrEP in both Zambia (P = 0.002) and Uganda (P < 0.001). Conclusion: PrEP implementation programs for FSWs could consider inclusion of HIV self-testing to reduce the clinic-based HIV testing burden. Trial registration: ClinicalTrials.gov NCT02827240 and NCT02846402.
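    Because participants were randomized in groups, the arm comparisons above used mixed effects logistic regression. A minimal sketch of that kind of model is shown below, assuming a hypothetical data frame with a binary preference outcome, a randomization-arm column, and a group identifier, and using statsmodels' Bayesian mixed GLM as one possible implementation; the authors' exact model specification and software are not stated in the abstract.

```python
# Minimal sketch of a mixed effects logistic regression comparing a binary
# outcome (preference for HIV self-testing) across randomization arms, with a
# random intercept for the randomization group (cluster). Data, column names,
# and effect sizes are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(1)
n = 300
arm = rng.choice(["direct", "coupon", "standard"], size=n)        # randomization arm
group_id = rng.integers(1, 31, size=n)                            # peer group (cluster)
prefers = (rng.random(n) < np.where(arm == "standard", 0.6, 0.85)).astype(int)
df = pd.DataFrame({"prefers_hivst": prefers, "arm": arm, "group_id": group_id})

# Random intercept for each randomization group.
vc_formulas = {"group": "0 + C(group_id)"}
model = BinomialBayesMixedGLM.from_formula(
    "prefers_hivst ~ C(arm, Treatment(reference='standard'))",
    vc_formulas,
    df,
)
result = model.fit_vb()   # variational Bayes fit
print(result.summary())
```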

    The ESR1 (6q25) locus is associated with calcaneal ultrasound parameters and radial volumetric bone mineral density in European men

    Purpose: Genome-wide association studies (GWAS) have identified 6q25, which incorporates the oestrogen receptor alpha gene (ESR1), as a quantitative trait locus for areal bone mineral density (BMDa) of the hip and lumbar spine. The aim of this study was to determine the influence of this locus on other bone health outcomes: calcaneal ultrasound (QUS) parameters, radial peripheral quantitative computed tomography (pQCT) parameters and markers of bone turnover in a population sample of European men. Methods: Eight single nucleotide polymorphisms (SNPs) in the 6q25 locus were genotyped in men aged 40-79 years from 7 European countries participating in the European Male Ageing Study (EMAS). The associations between SNPs and measured bone parameters were tested under an additive genetic model, adjusting for centre, using linear regression. Results: 2468 men, mean (SD) age 59.9 (11.1) years, had QUS measurements performed and bone turnover marker levels measured. A subset of 628 men had DXA and pQCT measurements. Multiple independent SNPs showed significant associations with BMD using all three measurement techniques. Most notably, rs1999805 was associated with a 0.10 SD (95% CI 0.05, 0.16; p = 0.0001) lower estimated BMD at the calcaneus, a 0.14 SD (95% CI 0.05, 0.24; p = 0.004) lower total hip BMDa, a 0.12 SD (95% CI 0.02, 0.23; p = 0.026) lower lumbar spine BMDa and a 0.18 SD (95% CI 0.06, 0.29; p = 0.003) lower trabecular BMD at the distal radius for each copy of the minor allele. There was no association with serum levels of bone turnover markers, and a single SNP that was associated with cortical density was also associated with cortical BMC and thickness. Conclusions: Our data replicate previous associations between SNPs in the 6q25 locus and BMDa at the hip and extend these data to include associations with calcaneal ultrasound parameters and radial volumetric BMD.
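    The per-allele estimates above come from an additive genetic model: each SNP is coded as the number of minor alleles (0, 1 or 2) and regressed against the standardized bone phenotype, adjusting for study centre. A minimal sketch of that model, with hypothetical column names and simulated data, might look like this:

```python
# Minimal sketch of the additive genetic model described above: a standardized
# bone phenotype regressed on minor-allele dosage (0/1/2), adjusting for centre.
# Column names, data, and effect sizes are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2468
df = pd.DataFrame({
    # rs1999805 minor-allele dosage, coded additively as 0, 1 or 2.
    "snp_dosage": rng.binomial(2, 0.4, size=n),
    "centre": rng.choice([f"centre_{i}" for i in range(1, 8)], size=n),
})
# Simulate a phenotype with a small per-allele effect (illustrative only).
df["bmd_z"] = -0.10 * df["snp_dosage"] + rng.normal(0, 1, size=n)

# Additive model: per-copy effect of the minor allele, adjusted for centre.
fit = smf.ols("bmd_z ~ snp_dosage + C(centre)", data=df).fit()
print("per-allele effect:", fit.params["snp_dosage"])
print("95% CI:", fit.conf_int().loc["snp_dosage"].values)
```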

    Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors

    Background: Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries. Methods: In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants. Findings: 45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men [for all listed symptoms]), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than those observed in the standard frequency groups. Interpretation: Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency. Funding: NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.
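    The headline estimates above (e.g., 1·69 additional units over 2 years in the 8-week versus 12-week male groups, with a 95% CI) are differences in mean donations between randomized arms. The sketch below illustrates that kind of arm-level comparison on simulated data, using ordinary least squares with the standard 12-week interval as the reference category; the trial's actual statistical analysis plan is not described in the abstract, and the data and effect sizes here are hypothetical.

```python
# Minimal sketch of an arm-level comparison of mean donations over 2 years,
# using OLS with the standard (12-week) interval as the reference category.
# Data and effect sizes are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
arms = {"12wk": 0.0, "10wk": 0.79, "8wk": 1.69}   # assumed mean differences vs standard
rows = []
for arm, shift in arms.items():
    donations = rng.poisson(5.0 + shift, size=7450)   # hypothetical donation counts
    rows.append(pd.DataFrame({"arm": arm, "donations": donations}))
df = pd.concat(rows, ignore_index=True)

fit = smf.ols("donations ~ C(arm, Treatment(reference='12wk'))", data=df).fit()
print(fit.summary().tables[1])   # arm effects and 95% CIs vs the 12-week group
```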